
    Debates—Stochastic subsurface hydrology from theory to practice: why stochastic modeling has not yet permeated into practitioners?

    This is the peer-reviewed version of the following article: [Sanchez-Vila, X., and D. Fernàndez-Garcia (2016), Debates—Stochastic subsurface hydrology from theory to practice: Why stochastic modeling has not yet permeated into practitioners?, Water Resour. Res., 52, 9246–9258, doi:10.1002/2016WR019302], which has been published in final form at http://onlinelibrary.wiley.com/doi/10.1002/2016WR019302/abstract. This article may be used for non-commercial purposes in accordance with Wiley Terms and Conditions for Self-Archiving.
    We address modern topics of stochastic hydrogeology in terms of their potential relevance to real modeling efforts at the field scale. While stochastic hydrogeology and numerical modeling have become routine in hydrogeological studies, nondeterministic models have not yet permeated into practice. We point out a number of limitations of stochastic modeling in real applications and comment on the reasons why stochastic models fail to become an attractive alternative for practitioners. We treat separately issues corresponding to flow, conservative transport, and reactive transport. The topics addressed are the emphasis on process modeling, the need for upscaling parameters and governing equations, the relevance of properly accounting for the detailed geological architecture in hydrogeological modeling, and the specific challenges of reactive transport. We conclude that the main responsibility for nondeterministic models not yet having permeated industry can be attributed to researchers in stochastic hydrogeology.
    Peer Reviewed. Postprint (author's final draft).

    Phenomenology of the detection of ultra-high energy cosmic rays and neutrinos using the radio technique

    Ultra-high energy cosmic rays are particles with energies up to 10^20 eV and beyond that arrive at the Earth after travelling across the Universe. These energies are more than one million times those available from man-made accelerators. Cosmic rays pose several questions that remain unanswered, such as what their composition is at ultra-high energies, what their sources are (the regions of the Universe where they are produced), how they are accelerated, or how they interact with the medium while they propagate towards the Earth. The existence of ultra-high energy cosmic rays that are protons or charged nuclei implies the production of neutrinos through the interactions these cosmic rays undergo during propagation, interactions that also limit the distance ultra-high energy cosmic rays can travel (GZK effect). On the other hand, neutrinos, being particles that interact only via the weak force and with cross sections about 10^7 times smaller than hadronic cross sections, can come from the edge of the Universe without deviating or interacting. This makes them extraordinary cosmic messengers. The detection methods for ultra-high energy cosmic rays and neutrinos rely on the particle showers created by the interaction of the cosmic ray with a particle in a medium (the atmosphere or ice, for instance). These showers are measured with detectors such as water tanks equipped with photomultipliers, or fluorescence telescopes. From the measurable quantities of a shower, several properties of the initial particle can be inferred, such as the energy, the type of particle, or the arrival direction. One of the detection methods is the radio technique. This technique began to be developed in the 1960s and yielded some promising first results, but the limitations of the electronics at the time forced the research to stop.
    In recent years, thanks to advances in electronics that now allow voltages to be measured with sub-nanosecond temporal precision, the radio technique is witnessing a renaissance, with experiments such as ANITA, LOFAR, CODALEMA, ARA, or ARIANNA. The basic idea of the radio technique is the following. When a cosmic ray or a neutrino collides with a material medium in the Earth, the resulting shower contains charge

    Stochastic estimation of hydraulic transmissivity fields using flow connectivity indicator data

    This is the peer-reviewed version of the following article: [Freixas, G., D. Fernàndez-Garcia, and X. Sanchez-Vila (2017), Stochastic estimation of hydraulic transmissivity fields using flow connectivity indicator data, Water Resour. Res., 53, 602–618, doi:10.1002/2015WR018507], which has been published in final form at http://onlinelibrary.wiley.com/doi/10.1002/2015WR018507/abstract. This article may be used for non-commercial purposes in accordance with Wiley Terms and Conditions for Self-Archiving.
    Most methods for hydraulic test interpretation rely on a number of simplifying assumptions regarding the homogeneity and isotropy of the underlying porous media. As a result, the actual heterogeneity of any natural parameter, such as transmissivity (T), is transferred to the corresponding estimates in a way heavily dependent on the interpretation method used. An example is a long-term pumping test interpreted by means of the Cooper-Jacob method, which implicitly assumes a homogeneous, isotropic, confined aquifer. The estimates obtained from this method are not local values, but still have a clear physical meaning: the estimated T represents a regional-scale effective value, while the log-ratio of the normalized estimated storage coefficient is an indicator of flow connectivity, representative of the scale given by the distance between the pumping and observation wells. In this work we propose a methodology that uses this connectivity indicator, together with sampled local measurements of transmissivity at selected points, to map the expected value of local T by means of a technique based on cokriging. Since the interpolation involves two variables measured at different support scales, a critical point is the estimation of the covariance and cross-covariance matrices.
    The method is applied to a synthetic field displaying statistical anisotropy, showing that the inclusion of connectivity indicators in the estimation method provides maps that effectively display preferential flow pathways, with direct consequences for solute transport. Peer Reviewed. Postprint (published version).

    De retibus socialibus et legibus momenti

    Online Social Networks (OSNs) are a cutting-edge topic. Almost everybody --users, marketers, brands, companies, and researchers-- is approaching OSNs to better understand them and take advantage of their benefits. Perhaps one of the key concepts underlying OSNs is that of influence, which is highly related, although not entirely identical, to those of popularity and centrality. Influence is, according to Merriam-Webster, "the capacity of causing an effect in indirect or intangible ways". Hence, in the context of OSNs, it has been proposed to analyze the clicks received by promoted URLs in order to check for any positive correlation between the number of visits and different "influence" scores. Such an evaluation methodology is used in this paper to compare a number of those techniques with a new method first described here. That new method is a simple and rather elegant solution that tackles influence in OSNs by applying a physical metaphor. Comment: Changes made for third revision: brief description of the dataset employed added to the Introduction; minor changes to the description of the preparation of the bit.ly datasets; minor changes to the captions of Tables 1 and 3; brief addition to the Conclusions section (future line of work added); references 16 and 18 added; some typos and grammar polished.
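The evaluation methodology mentioned above, checking for a positive correlation between the clicks received by promoted URLs and candidate influence scores, can be sketched with a rank correlation. The scores and click counts below are synthetic placeholders, not data from the paper:

```python
import numpy as np
from scipy.stats import spearmanr

# Synthetic stand-ins: a heavy-tailed "influence" score per account and the
# clicks its promoted URLs receive (assumed roughly proportional, plus noise).
rng = np.random.default_rng(1)
influence = rng.lognormal(mean=0.0, sigma=1.0, size=200)
clicks = (influence * 50 + rng.normal(0.0, 10.0, size=200)).clip(min=0)

# Spearman rank correlation is a natural choice here: click counts are
# heavy-tailed, so rank agreement matters more than a linear fit.
rho, pval = spearmanr(influence, clicks)
```

A score whose ranking tracks the click ranking yields rho close to 1; an uninformative score yields rho near 0.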

    Assessing the joint impact of DNAPL source-zone behavior and degradation products on the probabilistic characterization of human health risk

    The release of industrial contaminants into the subsurface has led to a rapid degradation of groundwater resources. Contamination caused by Dense Non-Aqueous Phase Liquids (DNAPLs) is particularly severe owing to their limited solubility, slow dissolution, and in many cases high toxicity. Greater insight into how the DNAPL source-zone behavior and the contaminant release towards the aquifer impact human health risk is crucial for appropriate risk management. Risk analysis is further complicated by the uncertainty in aquifer properties and contaminant conditions. This study focuses on the impact of the DNAPL release mode on human health risk propagation along the aquifer under uncertain conditions. Contaminant concentrations released from the source zone are described using a screening approach with a set of parameters representing several scenarios of DNAPL architecture. The uncertainty in the hydraulic properties is systematically accounted for by high-resolution Monte Carlo simulations. We simulate the release and transport of the chlorinated solvent perchloroethylene and its carcinogenic degradation products in randomly heterogeneous porous media. The human health risk posed by the chemical mixture of these contaminants is characterized by low-order statistics and the probability density function of common risk metrics. We show that the zone of high risk (hot spot) is independent of the DNAPL mass release mode, and that the risk amplitude is mostly controlled by heterogeneities and by the source-zone architecture. The risk is lower and less uncertain when the source zone is formed mostly by ganglia rather than by pools. We also illustrate how the source-zone efficiency (the intensity of the water flux crossing the source zone) affects the risk posed by exposure to the chemical mixture.
    Results show that high source-zone efficiencies are counter-intuitively beneficial, decreasing the risk because of a reduction in the time available for the production of the highly toxic subspecies. Peer Reviewed. Postprint (author's final draft).
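The serial degradation chain of perchloroethylene (PCE -> TCE -> DCE -> VC) under first-order kinetics admits the classic closed-form Bateman solution, which is what screening models of this kind typically build on. A minimal sketch, with invented rate constants rather than the study's values:

```python
import numpy as np

# Illustrative first-order rate constants [1/yr] for the chain
# PCE -> TCE -> DCE -> VC; these are assumptions, not the study's values.
rates = np.array([0.8, 0.5, 0.3, 0.2])

def bateman(c0, rates, t):
    """Concentration of each species of a serial first-order chain at time t,
    with all initial mass in the first species (classic Bateman solution,
    valid for pairwise-distinct rates)."""
    n = len(rates)
    c = np.zeros(n)
    for i in range(n):
        prod_k = np.prod(rates[:i])            # product of upstream rates
        s = 0.0
        for j in range(i + 1):
            denom = np.prod([rates[l] - rates[j]
                             for l in range(i + 1) if l != j])
            s += np.exp(-rates[j] * t) / denom
        c[i] = c0 * prod_k * s
    return c

c = bateman(c0=1.0, rates=rates, t=2.0)        # concentrations after 2 years
```

Because the daughter products (notably vinyl chloride) are more toxic than the parent, the risk metric depends on the whole vector c, not only on the parent concentration.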

    A Highly Available Cluster of Web Servers with Increased Storage Capacity

    Proceedings of the Seventeenth Jornadas de Paralelismo of the Universidad de Castilla-La Mancha, held on 18-20 September 2006 in Albacete.
    Web server scalability has traditionally been addressed by improving software elements or increasing the hardware resources of the server machine. Another approach has been the use of distributed architectures. In such architectures, the file allocation strategy has usually been either full replication or full distribution. In previous work we showed that partial replication offers a good balance between storage capacity and reliability: it offers much higher storage capacity, while reliability may be kept at a level equivalent to that of fully replicated solutions. In this paper we present the architectural details of Web cluster solutions adapted to partial replication. We also show that partial replication does not imply a performance penalty over classical fully replicated architectures. For evaluation purposes we have used a simulation model under the OMNeT++ framework, with mean service time as the performance comparison metric.
    Published
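The partial-replication scheme, placing each file on only k of the N cluster nodes instead of on every node, can be sketched as follows. The random placement policy and the numbers are illustrative assumptions, not the architecture evaluated in the paper:

```python
import random

def place_files(n_files, n_servers, k, seed=0):
    """Assign each file to k distinct servers chosen at random."""
    rng = random.Random(seed)
    return {f: rng.sample(range(n_servers), k) for f in range(n_files)}

N, k, n_files = 8, 2, 1000
placement = place_files(n_files, N, k)

# Copies stored per server: with full replication every server holds all
# 1000 files; with k = 2 each holds about n_files * k / N = 250 on average,
# i.e. roughly N/k = 4x the usable storage capacity, while every file still
# survives k - 1 = 1 server failure.
load = [0] * N
for replicas in placement.values():
    for s in replicas:
        load[s] += 1
```

This illustrates the storage/reliability trade-off: k interpolates between full distribution (k = 1, maximum capacity, no fault tolerance) and full replication (k = N).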

    New system for measuring impact vibration on floor decking sheets

    The present paper describes an alternative system, simpler and more economical, consisting of a predefined beating device and a sensor able to determine, once the hit has been produced, the energy absorbed by the plate. After the impact, the plate undergoes a deformation that absorbs part of the energy, the remainder being transmitted to the slab and, at the same time, causing induced airborne noise in the adjoining room.

    Toward efficiency in heterogeneous multispecies reactive transport modeling: A particle-tracking solution for first-order network reactions

    Modeling multispecies reactive transport in natural systems with strong heterogeneities and complex biochemical reactions is a major challenge for assessing groundwater sites polluted with organic and inorganic contaminants. A large variety of these contaminants react according to serial-parallel reaction networks, commonly simplified as a combination of first-order kinetic reactions. In this context, a random-walk particle-tracking method is presented. The method efficiently simulates the motion of particles affected by first-order network reactions in three-dimensional systems represented by spatially variable physical and biochemical coefficients described at high resolution. The approach is based on transition probabilities that describe the likelihood that a particle belonging to a given species and location at a given time will be transformed into another species and moved to another location afterward. These probabilities are derived from the solution matrix of the governing equations of the spatial moments. The method is fully coupled with reactions, free of numerical dispersion, and overcomes the inherent numerical problems stemming from the incorporation of heterogeneities into reactive transport codes. In doing so, we demonstrate that the motion of particles follows a standard random walk with time-dependent effective retardation and dispersion parameters that depend on the initial and final chemical state of the particle. The behavior of the effective parameters develops as a result of differential retardation effects among species. Moreover, explicit analytic solutions of the transition probability matrix and related particle motions are provided for serial reactions.
    An example of the effect of heterogeneity on the dechlorination of organic solvents in a three-dimensional random porous medium shows that the power-law behavior typically observed in conservative tracer breakthrough curves can be largely compromised by the effect of biochemical reactions. Postprint (published version).
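A minimal sketch of the core idea: over a time step, the species transition probabilities are the columns of the matrix exponential of the first-order rate matrix, and each particle samples its next species from them before taking a standard random-walk step. The 1-D setting, homogeneous coefficients, invented rates, and omission of retardation are simplifying assumptions, not the paper's full method:

```python
import numpy as np
from scipy.linalg import expm

# Serial first-order network (e.g. a dechlorination chain); rates [1/day] are
# illustrative. K has -k_i on the diagonal and k_i on the subdiagonal, so
# mass leaving species i is produced in species i + 1.
k = np.array([0.05, 0.03, 0.02, 0.0])        # last species does not degrade
K = np.diag(-k) + np.diag(k[:-1], -1)

dt = 1.0                                      # time step [days]
P = expm(K * dt)                              # P[j, i] = Prob(species j | species i)

rng = np.random.default_rng(0)
n = 10_000
species = np.zeros(n, dtype=int)              # all particles start as species 0
x = np.zeros(n)                               # 1-D particle positions [m]

v, D = 0.1, 0.01                              # velocity [m/day], dispersion [m^2/day]
cdf = np.cumsum(P, axis=0)                    # per-column CDFs for sampling
for _ in range(100):
    # Sample each particle's next species from the column of P matching
    # its current species (inverse-CDF sampling).
    u = rng.random(n)
    species = (u[None, :] < cdf[:, species]).argmax(axis=0)
    # Standard random-walk step (retardation omitted for brevity).
    x += v * dt + rng.normal(0.0, np.sqrt(2 * D * dt), n)
```

Because K conserves total mass within the network, the columns of P sum to one, so the species transitions are a proper Markov chain coupled to the random walk.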

    Modelling an aggregate processing unit. Case study: Fornelo processing plant

    This work develops a model to estimate the production, in relative mass and particle-size distribution, of an aggregate processing plant located in Fornelo (Porto, Portugal). Before the design work itself, it reviews the rock fragmentation techniques used to reduce the size of the fragments obtained from blasting, the particle classification and screening techniques and their main models, and the importance of aggregates in view of their applications. With the help of Microsoft Excel, a calculation model was developed to evaluate precisely, through successive iterations, the quantities and size distributions of the crushed and recirculated materials, as well as of the final products obtained from the processing. Its construction draws on the values and parameters reported in other experimental studies on aggregate processing. The model is intended as an alternative to software such as PlantDesigner or Bruno (Metso), conceived for the design and control of aggregate production plants. The tool allows the user to change the values given in equipment catalogues, or the destinations of the different final and intermediate products with their respective masses and size distributions, enabling tighter control of the quantity and granulometric quality of the final products to be marketed.
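The iterative calculation of recirculated material described above reduces, in its simplest form, to a fixed-point mass balance around the crusher and screen: crusher throughput equals the fresh feed plus the screen oversize returned to it. A minimal sketch, with an assumed screen pass fraction rather than plant data:

```python
def circulating_load(new_feed, p, tol=1e-9, max_iter=1000):
    """Total crusher throughput [t/h] when a fraction (1 - p) of its product
    fails the screen and is recirculated; iterates the mass balance
    total = new_feed + (1 - p) * total, which converges to new_feed / p."""
    total = new_feed
    for _ in range(max_iter):
        nxt = new_feed + (1.0 - p) * total
        if abs(nxt - total) < tol:
            return nxt
        total = nxt
    return total

# 100 t/h of fresh feed with an 80% screen pass fraction gives a steady
# crusher throughput of 100 / 0.8 = 125 t/h.
throughput = circulating_load(new_feed=100.0, p=0.8)
```

The full spreadsheet model iterates the same balance per size fraction, which is why the equipment-catalogue size distributions feed directly into the converged product masses.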